Syntax-based Attention Model for Natural Language Inference
Authors
Abstract
Introducing an attentional mechanism into neural networks is a powerful concept that has achieved impressive results in many natural language processing tasks. However, most existing models impose the attentional distribution on a flat topology, namely the entire input representation sequence. Yet any well-formed sentence comes with an accompanying syntactic tree structure, which is a much richer topology. Applying attention to this topology not only exploits the underlying syntax, but also makes the attention more interpretable. In this paper, we explore this direction in the context of natural language inference. The results demonstrate its efficacy. We also perform extensive qualitative analysis, deriving insights and intuitions into why and how our model works.
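The core idea, attending over the nodes of a syntactic tree rather than a flat token sequence, can be illustrated with a minimal sketch. This is not the paper's actual model; the node names, vectors, and dot-product scoring below are invented for illustration only.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Hypothetical tree: both leaf tokens and internal constituents
# (e.g. NP, VP) carry representation vectors, so attention can
# focus on a whole phrase, not just individual words.
tree_nodes = {
    "S":   [0.1, 0.3],
    "NP":  [0.4, 0.2],
    "VP":  [0.2, 0.5],
    "dog": [0.6, 0.1],
    "ran": [0.3, 0.6],
}

# A query vector, e.g. some representation of the hypothesis sentence.
query = [0.5, 0.5]

names = list(tree_nodes)
weights = softmax([dot(query, tree_nodes[n]) for n in names])

# Weighted sum of node representations: the attended summary.
attended = [sum(w * tree_nodes[n][i] for w, n in zip(weights, names))
            for i in range(2)]
```

Because internal constituents compete for attention mass alongside leaves, inspecting `weights` shows which phrase-level units the model focuses on, which is what makes this form of attention more interpretable than sequence-level attention.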
Similar Resources
Natural Language Syntax and First-Order Inference
We have argued elsewhere that first-order inference can be made more efficient by using non-standard syntax for first-order logic. In this paper we define a syntax for first-order logic based on the structure of natural language under Montague semantics. We show that, for a certain fairly expressive fragment of this language, satisfiability is polynomial time decidable. The polynomial time decision proce...
The Syntax, Semantics, and Inference Mechanism of Natural Language
It is both desirable and plausible to treat natural language itself as a knowledge representation (KR) formalism. Every KR formalism has a syntax and supports certain inferences. The syntax of a KR formalism specifies the form in which knowledge must be encoded, and its inference mechanism depends on its syntax. If natural language is a formalism for representing knowledge, then its syntax provid...
Probabilistic Grammars and Hierarchical Dirichlet Processes
Probabilistic context-free grammars (PCFGs) have played an important role in the modeling of syntax in natural language processing and other applications, but choosing the proper model complexity is often difficult. We present a nonparametric Bayesian generalization of the PCFG based on the hierarchical Dirichlet process (HDP). In our HDP-PCFG model, the effective complexity of the grammar can ...
Code-Copying in the Balochi Language of Sistan
This empirical study deals with language contact phenomena in Sistan. Code-copying is viewed as a strategy of linguistic behavior when a dominated language acquires new elements in lexicon, phonology, morphology, syntax, pragmatic organization, etc., which can be interpreted as copies of a dominating language. In this framework Persian is regarded as the model code which provides elements for b...
Journal:
- CoRR
Volume: abs/1607.06556
Pages: -
Publication date: 2016